Rényi divergence measures for commonly used univariate continuous distributions

Authors

  • M. Gil
  • Fady Alajaji
  • Tamás Linder
Abstract

Probabilistic ‘distances’ (also called divergences), which in some sense assess how ‘close’ two probability distributions are to one another, have been widely employed in probability, statistics, information theory, and related fields. Of particular importance, due to their generality and applicability, are the Rényi divergence measures. This paper presents closed-form expressions for the Rényi and Kullback-Leibler divergences for nineteen commonly used univariate continuous distributions, as well as those for multivariate Gaussian and Dirichlet distributions. In addition, a table summarizing four of the most important information measure rates for zero-mean stationary Gaussian processes, namely the Rényi entropy, differential Shannon entropy, Rényi divergence, and Kullback-Leibler divergence, is presented. Lastly, a connection between the Rényi divergence and the variance of the log-likelihood ratio of two distributions is established, thereby extending a previous result by Song [J. Stat. Plan. Infer. 93 (2001)] on the relation between the Rényi entropy and the log-likelihood function. A table with the corresponding variance expressions for the univariate distributions considered here is also included.
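
For reference, the Rényi divergence of order α that underlies all of the closed-form expressions tabulated in the paper is the standard quantity

```latex
D_\alpha(p \,\|\, q) \;=\; \frac{1}{\alpha - 1}\,
\log \int_{-\infty}^{\infty} p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx,
\qquad \alpha > 0,\ \alpha \neq 1,
```

which recovers the Kullback-Leibler divergence in the limit, \( \lim_{\alpha \to 1} D_\alpha(p \,\|\, q) = \int p \log (p/q)\, dx \).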
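
To make the kind of closed form the paper tabulates concrete, here is a minimal Python sketch of the widely known univariate Gaussian case (the function names are my own, and the quadrature routine is only a numerical sanity check of the formula, not anything taken from the paper):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def renyi_divergence_normal(mu1, s1, mu2, s2, alpha):
    """Closed-form D_alpha( N(mu1, s1^2) || N(mu2, s2^2) ), alpha > 0, alpha != 1.

    Finite only when the mixed variance (1 - alpha)*s1^2 + alpha*s2^2 is
    positive (always true for 0 < alpha < 1; can fail for alpha > 1).
    """
    s_star2 = (1.0 - alpha) * s1**2 + alpha * s2**2
    if s_star2 <= 0.0:
        return np.inf  # the divergence is infinite in this regime
    return (np.log(s2 / s1)
            + np.log(s2**2 / s_star2) / (2.0 * (alpha - 1.0))
            + alpha * (mu1 - mu2)**2 / (2.0 * s_star2))

def renyi_divergence_quad(mu1, s1, mu2, s2, alpha):
    """Direct numerical evaluation of log( int p^alpha q^(1-alpha) ) / (alpha - 1)."""
    integrand = lambda x: norm.pdf(x, mu1, s1)**alpha * norm.pdf(x, mu2, s2)**(1.0 - alpha)
    val, _ = quad(integrand, -np.inf, np.inf)
    return np.log(val) / (alpha - 1.0)

if __name__ == "__main__":
    args = (0.0, 1.0, 1.0, 2.0, 0.5)
    print(renyi_divergence_normal(*args))  # closed form: ~0.3231
    print(renyi_divergence_quad(*args))    # quadrature check, should agree
```

Agreement between the two outputs checks the closed form against the defining integral; for α > 1 the closed form can be infinite, which the integral reflects by diverging.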
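
The last result mentioned in the abstract concerns the behavior of \( D_\alpha \) near α = 1. Stated in the form it usually takes in the literature (a paraphrase under standard regularity conditions, not a quotation from the paper),

```latex
\left. \frac{\partial}{\partial \alpha}\, D_\alpha(p \,\|\, q) \right|_{\alpha = 1}
\;=\; \frac{1}{2}\,\operatorname{Var}_p\!\left[ \log \frac{p(X)}{q(X)} \right],
```

so that near \( \alpha = 1 \) the Rényi divergence expands as the Kullback-Leibler divergence plus \( \tfrac{\alpha - 1}{2} \) times the variance of the log-likelihood ratio; Song's original result is the analogous statement for the Rényi entropy.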

Similar Articles

On Rényi Divergence Measures for Continuous Alphabet Sources

The idea of ‘probabilistic distances’ (also called divergences), which in some sense assess how ‘close’ two probability distributions are to one another, has been widely employed in probability, statistics, information theory, and related fields. Of particular importance, due to their generality and applicability, are the Rényi divergence measures. While the closely related concept of Rényi ent...
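
The snippet above is cut off at “Rényi ent...”; for completeness, the differential Rényi entropy of order α that it refers to is the standard

```latex
h_\alpha(f) \;=\; \frac{1}{1 - \alpha}\, \log \int_{-\infty}^{\infty} f(x)^{\alpha}\, dx,
\qquad \alpha > 0,\ \alpha \neq 1,
```

which tends to the differential Shannon entropy \( h(f) = -\int f \log f \) as \( \alpha \to 1 \).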

Information Measures via Copula Functions

In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among them, in this paper, we will review measures such as the Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, … and so on. Properties and results related to distance between probability d...

Image Registration and Segmentation by Maximizing the Jensen-Rényi Divergence

Information-theoretic measures provide quantitative entropic divergences between two probability distributions or data sets. In this paper, we analyze the theoretical properties of the Jensen-Rényi divergence, which is defined among an arbitrary number of probability distributions. Using the theory of majorization, we derive its maximum value, and also some performance upper bounds in terms o...
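
For orientation, the Jensen-Rényi divergence discussed in that paper is commonly defined through the Rényi entropy \( H_\alpha \): given distributions \( p_1, \ldots, p_n \) and weights \( \omega_i \ge 0 \) with \( \sum_i \omega_i = 1 \), the usual definition from the Jensen-Rényi literature (not quoted from the snippet itself) is

```latex
JR_\alpha^{\omega}(p_1, \ldots, p_n)
\;=\; H_\alpha\!\Big( \sum_{i=1}^{n} \omega_i\, p_i \Big)
\;-\; \sum_{i=1}^{n} \omega_i\, H_\alpha(p_i),
```

i.e., the entropy of the mixture minus the mixture of the entropies, which is nonnegative for \( \alpha \in (0, 1) \) by the concavity of \( H_\alpha \) there.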

Tighter Security for Efficient Lattice Cryptography via the Rényi Divergence of Optimized Orders

In security proofs of lattice-based cryptography, bounding the closeness of two probability distributions is an important procedure. To measure this closeness, the Rényi divergence has been used instead of the classical statistical distance. Recent results have shown that the Rényi divergence offers security reductions with better parameters, e.g., smaller deviations for discrete Gaussian distrib...

A Few Characterizations of the Univariate Continuous Distributions

Various characterizations of distributions, in their generality, are presented in terms of conditional expectations. Some special examples are given as well.

Journal:
  • Inf. Sci.

Volume 249, Issue -

Pages -

Publication date: 2013